A fast dual proximal gradient algorithm for convex minimization and applications
Authors
Abstract
We consider the convex composite problem of minimizing the sum of a strongly convex function and a general extended-valued convex function. We present a dual-based proximal gradient scheme for solving this problem. We show that although the dual objective function sequence converges to the optimal value at a rate of O(1/k²), the rate of convergence of the primal sequence is of the order O(1/k). © 2013 Elsevier B.V. All rights reserved.
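The dual-based idea in the abstract can be made concrete on a small instance. The sketch below (an illustration, not the paper's exact algorithm) runs an accelerated projected gradient method on the dual of the 1-D total-variation denoising problem min_x ½‖x − b‖² + λ‖Dx‖₁, where the strongly convex part is the quadratic and the nonsmooth part is λ‖D·‖₁; the function name, step-size constant, and iteration count are assumptions for this example.

```python
import numpy as np

def fdpg_tv_denoise(b, lam, iters=200):
    """Accelerated projected gradient on the dual of
    min_x 0.5*||x - b||^2 + lam*||Dx||_1, with D the 1-D difference operator.
    Dual problem: min_{||y||_inf <= lam} 0.5*||b - D^T y||^2.
    The primal point is recovered as x = b - D^T y."""
    n = len(b)
    D = lambda x: x[1:] - x[:-1]                     # forward differences
    Dt = lambda y: np.concatenate(([-y[0]], y[:-1] - y[1:], [y[-1]]))
    L = 4.0                                          # ||D||^2 <= 4 for this operator
    y = np.zeros(n - 1)
    w = y.copy()                                     # momentum point
    t = 1.0
    for _ in range(iters):
        x = b - Dt(w)                                # primal point from dual variable
        # projected gradient step on the dual (projection onto the box [-lam, lam])
        y_new = np.clip(w + D(x) / L, -lam, lam)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0   # Nesterov momentum update
        w = y_new + ((t - 1.0) / t_new) * (y_new - y)
        y, t = y_new, t_new
    return b - Dt(y)
```

In line with the abstract, the accelerated scheme drives the dual objective down at an O(1/k²) rate, while the recovered primal iterates x = b − Dᵀy converge more slowly, at O(1/k).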
Similar articles
Convergence rate of a proximal multiplier algorithm for separable convex minimization
This paper analyzes the convergence rate of a proximal algorithm called the Proximal Multiplier Algorithm with Proximal Distances (PMAPD), proposed by us in [20], for solving convex minimization problems with separable structure. We prove that, under mild assumptions, its primal-dual sequences converge linearly to the optimal solution for a class of proximal distances.
The proximal-proximal gradient algorithm
We consider the problem of minimizing a convex objective which is the sum of a smooth part, with Lipschitz continuous gradient, and a nonsmooth part. Inspired by various applications, we focus on the case when the nonsmooth part is a composition of a proper closed convex function P and a nonzero affine map, with the proximal mappings of τP, τ > 0, easy to compute. In this case, a direct applic...
Fast Convex Optimization Algorithms for Exact Recovery of a Corrupted Low-rank Matrix
This paper studies algorithms for solving the problem of recovering a low-rank matrix with a fraction of its entries arbitrarily corrupted. This problem can be viewed as a robust version of classical PCA, and arises in a number of application domains, including image processing, web data ranking, and bioinformatic data analysis. It was recently shown that under surprisingly broad conditions, it...
Spectral projected gradient and variable metric methods for optimization with linear inequalities
A family of variable metric methods for convex constrained optimization was introduced recently by Birgin, Martínez and Raydan. One of the members of this family is the Inexact Spectral Projected Gradient (ISPG) method for minimization with convex constraints. At each iteration of these methods a strictly convex quadratic function with convex constraints must be (inexactly) minimized. In the ca...
Exploiting Strong Convexity from Data with Primal-Dual First-Order Algorithms
We consider empirical risk minimization of linear predictors with convex loss functions. Such problems can be reformulated as convex-concave saddle point problems, and thus are well suited for primal-dual first-order algorithms. However, primal-dual algorithms often require explicit strongly convex regularization in order to obtain fast linear convergence, and the required dual proximal mappi...
Journal: Oper. Res. Lett.
Volume 42, Issue -
Pages -
Publication date: 2014